Mitigating Catastrophic Forgetting with Complementary Layered Learning

Authors

Abstract

Catastrophic forgetting is a stability–plasticity imbalance that causes a machine learner to lose previously gained knowledge that is critical for performing a task. The imbalance occurs during transfer learning and negatively affects the learner’s performance, particularly in neural networks and layered learning. This work proposes a complementary learning technique that introduces long- and short-term memory to reduce the negative effects of catastrophic forgetting. In particular, this dual-memory system is applied to non-neural-network approaches, namely evolutionary computation and Q-learning instances, because these techniques are used to develop decision-making capabilities in physical robots. Experiments evaluate the new augmentation in a multi-agent simulation, where autonomous unmanned aerial vehicles learn to collaborate and maneuver to survey an area effectively. Through direct-policy and value-based experiments, the proposed approach is shown to significantly improve task performance over the standard method, successfully balancing stability and plasticity.
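
As a rough, non-authoritative illustration of the dual-memory idea described in the abstract, the sketch below pairs a plastic short-term Q-table with a slowly consolidated long-term Q-table. The class name, the equal blend of the two stores, and the consolidation rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a long-/short-term memory augmentation for tabular
# Q-learning. All names and constants are illustrative assumptions.
import random
from collections import defaultdict


class DualMemoryQLearner:
    def __init__(self, actions, alpha=0.1, gamma=0.95, epsilon=0.1, consolidation=0.05):
        self.actions = actions
        self.alpha = alpha                    # short-term (plastic) learning rate
        self.gamma = gamma                    # discount factor
        self.epsilon = epsilon                # exploration rate
        self.consolidation = consolidation    # slow transfer into long-term memory
        self.short_term = defaultdict(float)  # adapts quickly to the current task
        self.long_term = defaultdict(float)   # retains knowledge from earlier tasks

    def q(self, state, action):
        # Behaviour combines both stores; the long-term store anchors old skills.
        return 0.5 * self.short_term[(state, action)] + 0.5 * self.long_term[(state, action)]

    def act(self, state):
        if random.random() < self.epsilon:
            return random.choice(self.actions)
        return max(self.actions, key=lambda a: self.q(state, a))

    def update(self, state, action, reward, next_state):
        # Standard Q-learning update applied to the plastic short-term store.
        target = reward + self.gamma * max(self.q(next_state, a) for a in self.actions)
        td_error = target - self.short_term[(state, action)]
        self.short_term[(state, action)] += self.alpha * td_error
        # Slow consolidation: the long-term store drifts toward the short-term
        # values, preserving stability while new behaviour is acquired.
        lt = self.long_term[(state, action)]
        self.long_term[(state, action)] = lt + self.consolidation * (
            self.short_term[(state, action)] - lt
        )
```

For the direct-policy (evolutionary computation) variant mentioned in the abstract, the same long-/short-term split could in principle be applied to the evolved policy representation rather than to a Q-table.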

Similar Articles

Mitigating Catastrophic Forgetting in Temporal Difference Learning with Function Approximation

Neural networks have had many great successes in recent years, particularly with the advent of deep learning and many novel training techniques. One issue that has prevented reinforcement learning from taking full advantage of scalable neural networks is that of catastrophic forgetting. The latter affects supervised learning systems when highly correlated input samples are presented, as well as...

Alleviating Catastrophic Forgetting via Multi-Objective Learning [IJCNN1762]

Handling catastrophic forgetting is an interesting and challenging topic in modeling the memory mechanisms of the human brain using machine learning models. From a more general point of view, catastrophic forgetting reflects the stability-plasticity dilemma, which is one of the several dilemmas to be addressed in learning systems: to retain the stored memory while learning new information. Diff...

Block Neural Network Avoids Catastrophic Forgetting When Learning Multiple Task

In the present work we propose a Deep Feed Forward network architecture which can be trained according to a sequential learning paradigm, where tasks of increasing difficulty are learned sequentially, yet avoiding catastrophic forgetting. The proposed architecture can re-use the features learned on previous tasks in a new task when the old tasks and the new one are related. The architecture nee...
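
As a loose illustration of such a block-wise design (the details below are assumptions, not the cited paper's architecture), the sketch freezes blocks trained on earlier tasks and appends a new trainable block and head for each new, related task, so that new heads can re-use previously learned features:

```python
# Hypothetical block-wise feed-forward network: earlier blocks are frozen,
# a new block and head are added per task. Illustrative only.
import torch
import torch.nn as nn


class BlockNetwork(nn.Module):
    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.in_dim, self.hidden_dim, self.out_dim = in_dim, hidden_dim, out_dim
        self.blocks = nn.ModuleList()  # one feature block per task
        self.heads = nn.ModuleList()   # one output head per task

    def add_task(self):
        # Freeze blocks learned on earlier tasks so their features are preserved.
        for block in self.blocks:
            for p in block.parameters():
                p.requires_grad = False
        self.blocks.append(nn.Sequential(nn.Linear(self.in_dim, self.hidden_dim), nn.ReLU()))
        # The new head re-uses features from all blocks available so far.
        self.heads.append(nn.Linear(self.hidden_dim * len(self.blocks), self.out_dim))

    def forward(self, x, task_id=-1):
        task_id = task_id % len(self.blocks)
        # Concatenate features from the blocks that existed when this head was added.
        features = torch.cat([block(x) for block in self.blocks[: task_id + 1]], dim=-1)
        return self.heads[task_id](features)
```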

Catastrophic forgetting in connectionist networks.

All natural cognitive systems, and, in particular, our own, gradually forget previously learned information. Plausible models of human cognition should therefore exhibit similar patterns of gradual forgetting of old information as new information is acquired. Only rarely does new learning in natural cognitive systems completely disrupt or erase previously learned information; that is, natural c...

Catastrophic Forgetting, Rehearsal and Pseudorehearsal

This paper reviews the problem of catastrophic forgetting (the loss or disruption of previously learned information when new information is learned) in neural networks, and explores rehearsal mechanisms (the retraining of some of the previously learned information as the new information is added) as a potential solution. We replicate some of the experiments described by Ratcliff (1990), includi...
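
As a minimal sketch of the rehearsal idea (the names and training loop are illustrative assumptions, not the reviewed experiments), the snippet below mixes a random sample of previously learned items into training on the new items:

```python
# Hypothetical rehearsal loop: new items are learned while a sample of old
# items is re-trained alongside them. Illustrative only.
import random
import torch
import torch.nn as nn


def train_with_rehearsal(model, new_items, old_items, rehearsal_size=16,
                         epochs=10, lr=1e-2):
    """new_items / old_items: lists of (input_tensor, target_tensor) pairs."""
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        # Mix the new information with a rehearsal sample of old information.
        rehearsal = random.sample(old_items, min(rehearsal_size, len(old_items)))
        for x, y in new_items + rehearsal:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
    return model
```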

Journal

Journal title: Electronics

Year: 2023

ISSN: 2079-9292

DOI: https://doi.org/10.3390/electronics12030706